barren plateau
Escaping from the Barren Plateau via Gaussian Initializations in Deep Variational Quantum Circuits
Variational quantum circuits have been widely employed in quantum simulation and quantum machine learning in recent years. However, quantum circuits with random structures have poor trainability, because the gradient vanishes exponentially in the circuit depth and the qubit number. This has led to the general view that deep quantum circuits are not feasible for practical tasks. In this work, we propose an initialization strategy with theoretical guarantees against the vanishing-gradient problem in general deep quantum circuits. Specifically, we prove that under properly Gaussian-initialized parameters, the norm of the gradient decays at most polynomially as the qubit number and the circuit depth increase. Our theoretical results hold for both local and global observables, where the latter were believed to exhibit vanishing gradients even for very shallow circuits. Experimental results verify our theoretical findings in quantum simulation and quantum chemistry.
- Information Technology > Hardware (1.00)
- Information Technology > Artificial Intelligence > Machine Learning (0.78)
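The abstract above contrasts the standard uniform initialization, which suffers from barren plateaus, with narrow Gaussian initialization whose width shrinks with depth. A minimal sketch of the two schemes, with an illustrative variance scaling of 1/n_layers (the exact scaling in the paper is an assumption here):

```python
import numpy as np

def init_params(n_layers, n_qubits, scheme="gaussian", rng=None):
    """Initialize rotation angles for a layered variational circuit.

    'uniform' draws from [0, 2*pi), which is known to produce barren
    plateaus; 'gaussian' draws from N(0, 1/n_layers), a narrow
    distribution whose width shrinks with depth (illustrative scaling).
    """
    rng = rng or np.random.default_rng()
    shape = (n_layers, n_qubits)
    if scheme == "uniform":
        return rng.uniform(0.0, 2 * np.pi, size=shape)
    if scheme == "gaussian":
        return rng.normal(0.0, 1.0 / np.sqrt(n_layers), size=shape)
    raise ValueError(f"unknown scheme: {scheme}")

# a deep circuit starts near the identity under Gaussian initialization
deep = init_params(n_layers=100, n_qubits=8, scheme="gaussian",
                   rng=np.random.default_rng(0))
```

The design point is that the Gaussian-initialized circuit begins close to the identity, so the cost landscape near the starting point is far from the Haar-random regime in which gradients concentrate around zero.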
Energy-dependent barren plateau in bosonic variational quantum circuits
Zhang, Bingzhi, Zhuang, Quntao
Variational quantum circuits (VQCs) [1] are candidates for achieving practical quantum advantages in the noisy intermediate-scale quantum (NISQ) era [2], when scalable error-corrected quantum computers are not yet available. VQCs use classical control to optimize a quantum circuit to solve computational problems, including optimization [3], eigensystem problems [4-10], partial differential equations [11], quantum simulation [12-14], and machine learning [15-23]. As a general approach to designing quantum circuits, they have also found applications in the approximation [24], preparation [25, 26], classification [27-31], and tomography [32] of quantum states. Initial works on VQCs concern discrete-variable (DV) finite-dimensional systems of qubits, which are natural for computation, while continuous-variable (CV) systems of bosonic qumodes are less explored. Yet many important quantum systems are modelled by qumodes. For example, quantum communication and networking [33-37] rely on photons, the only flying quantum information carrier; in this regard, quantum transduction and entanglement distillation have been shown to be enhanced by CV VQCs [38]. Photonic quantum computers [39, 40] also rely on bosonic encodings such as the cat code and the Gottesman-Kitaev-Preskill (GKP) code [41], which have shown great promise [42, 43]; the engineering of such code states is greatly boosted by CV VQCs [44-47]. Finally, distributed entangled sensor networks ubiquitously rely on CV VQCs to achieve quantum advantages in sensing [48-51] and data classification [52, 53]. Unlike traditional algorithms, the runtime of VQCs is characterized by the time needed to train the variational parameters to optimize a cost function.
- Information Technology > Hardware (1.00)
- Information Technology > Artificial Intelligence > Machine Learning (1.00)
Policy Gradient Approach to Compilation of Variational Quantum Circuits
We propose a method for finding approximate compilations of quantum unitary transformations, based on techniques from policy gradient reinforcement learning. The choice of a stochastic policy allows us to rephrase the optimization problem in terms of probability distributions, rather than variational gates. In this framework, the optimal configuration is found by optimizing over distribution parameters, rather than over free angles. We show numerically that this approach can be more competitive than gradient-free methods, for a comparable amount of resources, both for noiseless and noisy circuits. Another interesting feature of this approach to variational compilation is that it does not need a separate register and long-range interactions to estimate the end-point fidelity, which is an improvement over methods which rely on the Hilbert-Schmidt test. We expect these techniques to be relevant for training variational circuits in other contexts.
- Information Technology > Hardware (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Reinforcement Learning (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (0.93)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Optimization (0.88)
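The core idea above, optimizing over distribution parameters rather than over free angles, can be sketched with a toy single-qubit "compilation" task: a Gaussian policy over an RY angle is trained with a score-function (REINFORCE) gradient to match a target rotation. The target angle, policy width, and fidelity formula are illustrative assumptions, not the paper's setup:

```python
import numpy as np

# Toy compilation task: find the RY angle whose output state matches a
# target state, optimizing the *mean of a Gaussian policy* over angles
# rather than the angle itself (hypothetical parameters throughout).
rng = np.random.default_rng(0)
theta_star = 1.2          # target angle (assumed for illustration)
mu, sigma, lr = 0.0, 0.25, 0.2

def fidelity(theta):
    # |<psi(theta)|psi(theta_star)>|^2 for RY rotations acting on |0>
    return np.cos((theta - theta_star) / 2.0) ** 2

for _ in range(300):
    thetas = rng.normal(mu, sigma, size=200)   # sample angles from policy
    rewards = fidelity(thetas)
    baseline = rewards.mean()                  # variance reduction
    # score-function estimate of d E[fidelity] / d mu
    grad_mu = np.mean((rewards - baseline) * (thetas - mu) / sigma**2)
    mu += lr * grad_mu                         # gradient ascent on the mean

# mu should end up near theta_star
```

Because the update needs only sampled angles and their rewards, no gradient of the circuit itself is required, which is what makes the stochastic-policy formulation attractive for noisy circuits.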
Equivalence of quantum barren plateaus to cost concentration and narrow gorges
Arrasmith, Andrew, Holmes, Zoë, Cerezo, M., Coles, Patrick J.
Optimizing parameterized quantum circuits (PQCs) is the leading approach to make use of near-term quantum computers. However, very little is known about the cost function landscape for PQCs, which hinders progress towards quantum-aware optimizers. In this work, we investigate the connection between three different landscape features that have been observed for PQCs: (1) exponentially vanishing gradients (called barren plateaus), (2) exponential cost concentration about the mean, and (3) the exponential narrowness of minima (called narrow gorges). We analytically prove that these three phenomena occur together, i.e., when one occurs then so do the other two. A key implication of this result is that one can numerically diagnose barren plateaus via cost differences rather than via the computationally more expensive gradients. More broadly, our work shows that quantum mechanics rules out certain cost landscapes (which otherwise would be mathematically possible), and hence our results are interesting from a quantum foundations perspective.
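The practical implication above, diagnosing barren plateaus through cost differences instead of gradients, can be illustrated on a toy product-rotation circuit whose cost factorizes over qubits. The circuit and cost function are stand-ins, not the paper's PQC ansatz:

```python
import numpy as np

def cost(thetas):
    # C(theta) = |<0..0| RY(theta_1) x ... x RY(theta_n) |0..0>|^2
    # factorizes over qubits for this toy product circuit
    return np.prod(np.cos(thetas / 2.0) ** 2, axis=-1)

def cost_difference_variance(n_qubits, n_pairs=2000, rng=None):
    """Var[C(theta_A) - C(theta_B)] over random parameter pairs:
    a gradient-free proxy for detecting cost concentration."""
    rng = rng or np.random.default_rng(0)
    a = rng.uniform(0, 2 * np.pi, size=(n_pairs, n_qubits))
    b = rng.uniform(0, 2 * np.pi, size=(n_pairs, n_qubits))
    return np.var(cost(a) - cost(b))

# concentration: the variance of cost differences shrinks with qubit number
v2, v8 = cost_difference_variance(2), cost_difference_variance(8)
assert v8 < v2
```

Each cost evaluation here replaces a full gradient estimate, which is exactly the computational saving the equivalence result licenses.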
Deep Machine Learning Reconstructing Lattice Topology with Strong Thermal Fluctuations
Wang, Xiao-Han, Shi, Pei, Xi, Bin, Hu, Jie, Ran, Shi-Ju
Applying artificial intelligence to scientific problems (namely, AI for science) is currently under hot debate. However, scientific problems differ greatly from conventional ones involving images, texts, etc., and new challenges emerge from unbalanced scientific data and complicated effects of the physical setup. In this work, we demonstrate the validity of a deep convolutional neural network (CNN) for reconstructing the lattice topology (i.e., the spin connectivities) in the presence of strong thermal fluctuations and unbalanced data. Taking the kinetic Ising model with Glauber dynamics as an example, the CNN maps the time-dependent local magnetic momenta (a single-node feature) evolved from a specific initial configuration (dubbed an evolution instance) to the probabilities of the presence of the possible couplings. Our scheme is distinct from previous ones that may require knowledge of the node dynamics, responses to perturbations, or the evaluation of statistical quantities such as correlations or transfer entropy from many evolution instances. Fine-tuning avoids the "barren plateau" caused by the strong thermal fluctuations at high temperatures. Accurate reconstructions can be made even where thermal fluctuations dominate over the correlations and statistical methods consequently fail in general. Meanwhile, we unveil the generalization of the CNN to instances evolved from unlearnt initial spin configurations and to those with unlearnt lattices. We raise an open question on learning with unbalanced data in the nearly "double-exponentially" large sample space.
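The data-generation step behind the scheme above, evolving a kinetic Ising model under Glauber dynamics and recording single-node features, can be sketched as follows. The chain lattice, temperature, and sizes are hypothetical choices for illustration:

```python
import numpy as np

def glauber_trajectory(J, T, n_steps, rng=None):
    """Simulate kinetic-Ising Glauber dynamics and record the spin
    configuration (single-node features) after every sweep.

    J : (N, N) symmetric coupling matrix encoding the lattice topology.
    Returns an (n_steps, N) array of spin values in {-1, +1}.
    """
    rng = rng or np.random.default_rng(0)
    n = J.shape[0]
    spins = rng.choice([-1, 1], size=n)   # random initial configuration
    traj = np.empty((n_steps, n))
    for t in range(n_steps):
        for i in rng.permutation(n):
            local_field = J[i] @ spins
            # Glauber (heat-bath) flip probability for spin i
            p_up = 1.0 / (1.0 + np.exp(-2.0 * local_field / T))
            spins[i] = 1 if rng.random() < p_up else -1
        traj[t] = spins
    return traj

# tiny 1D chain as a stand-in lattice (illustrative)
N = 8
J = np.zeros((N, N))
for i in range(N - 1):
    J[i, i + 1] = J[i + 1, i] = 1.0
data = glauber_trajectory(J, T=3.0, n_steps=50)
```

A CNN trained on many such trajectories (many initial configurations per candidate lattice) would then output, per spin pair, the probability that the corresponding coupling is present.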
Laziness, Barren Plateau, and Noise in Machine Learning
Liu, Junyu, Lin, Zexi, Jiang, Liang
We define \emph{laziness} to describe a large suppression of variational parameter updates for neural networks, classical or quantum. In the quantum case, the suppression is exponential in the number of qubits for randomized variational quantum circuits. We discuss the difference between laziness and the \emph{barren plateau}, a term coined by quantum physicists in \cite{mcclean2018barren} for the flatness of the loss-function landscape during gradient descent, and develop a theoretical understanding of the two phenomena in light of the theory of neural tangent kernels. For noiseless quantum circuits without measurement noise, the loss-function landscape is complicated in the overparametrized regime with a large number of trainable variational angles: around a random starting point in optimization, there is a large number of local minima that are good enough to minimize the mean-square loss function, so we still have quantum laziness but no barren plateaus. However, the complicated landscape is not visible within a limited number of iterations or at low precision in quantum control and quantum sensing. Moreover, we examine the effect of noise during optimization under intuitive noise models and show that variational quantum algorithms are noise-resilient in the overparametrized regime. Our work reformulates the quantum barren plateau statement as a precision statement and justifies it in certain noise models, injects new hope into near-term variational quantum algorithms, and provides theoretical connections to classical machine learning. Our paper offers conceptual perspectives on quantum barren plateaus, together with discussions of the gradient-descent dynamics in \cite{together}.
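The classical side of the laziness phenomenon discussed above, parameter updates that shrink as the model widens, can be illustrated with a random-features model in the NTK-style 1/sqrt(width) scaling. The model and the single-step setup are illustrative assumptions, not the paper's construction:

```python
import numpy as np

def mean_relative_update(width, n_trials=20):
    """Average relative parameter movement after one gradient step on a
    random-features model f(x) = theta . phi(x) / sqrt(width).
    In the lazy (NTK) regime this shrinks as the width grows."""
    vals = []
    for seed in range(n_trials):
        rng = np.random.default_rng(seed)
        theta = rng.normal(size=width)        # initial parameters
        phi = rng.normal(size=width)          # fixed random features
        y = 1.0                               # scalar regression target
        f = theta @ phi / np.sqrt(width)
        grad = (f - y) * phi / np.sqrt(width)  # grad of 0.5 * (f - y)^2
        vals.append(np.linalg.norm(grad) / np.linalg.norm(theta))
    return float(np.mean(vals))

# the relative update shrinks roughly as 1/sqrt(width)
wide, narrow = mean_relative_update(10000), mean_relative_update(100)
assert wide < narrow
```

Here the function value moves by an order-one amount while the parameters barely move, which is the classical signature of lazy training that the paper connects to its quantum counterpart.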
Breakthrough proof clears path for quantum AI: Novel theorem demonstrates convolutional neural networks can always be trained on quantum computers, overcoming threat of 'barren plateaus' in optimization problems
"The way you construct a quantum neural network can lead to a barren plateau -- or not," said Marco Cerezo, coauthor of the paper titled "Absence of Barren Plateaus in Quantum Convolutional Neural Networks," published today by a Los Alamos National Laboratory team in Physical Review X. Cerezo is a physicist specializing in quantum computing, quantum machine learning, and quantum information at Los Alamos. "We proved the absence of barren plateaus for a special type of quantum neural network. Our work provides trainability guarantees for this architecture, meaning that one can generically train its parameters." As an artificial intelligence (AI) methodology, quantum convolutional neural networks are inspired by the visual cortex. As such, they involve a series of convolutional layers, or filters, interleaved with pooling layers that reduce the dimension of the data while keeping important features of a data set.
Los Alamos National Laboratory breakthrough heralds Quantum AI
Convolutional neural networks running on quantum computers have generated significant buzz for their potential to analyse quantum data better than classical computers can. While a fundamental solvability problem known as "barren plateaus" has limited the application of these neural networks for large data sets, new research overcomes that Achilles heel with a rigorous proof that guarantees scalability. "The way you construct a quantum neural network can lead to a barren plateau--or not," said Marco Cerezo, co-author of the paper titled "Absence of Barren Plateaus in Quantum Convolutional Neural Networks," published today by a Los Alamos National Laboratory team in Physical Review X. Cerezo is a physicist specializing in quantum computing, quantum machine learning, and quantum information at Los Alamos. "We proved the absence of barren plateaus for a special type of quantum neural network. Our work provides trainability guarantees for this architecture, meaning that one can generically train its parameters."